# Multilingual Mask Filling
## Multilingual ModernBert Large Preview
License: MIT · Author: makiart · Tags: Large Language Model

A large multilingual BERT model developed by the Algomatic team. It supports an 8192-token context length, was trained on approximately 60 billion tokens, and is suited to mask-filling tasks.
## Multilingual ModernBert Base Preview
License: MIT · Author: makiart · Tags: Large Language Model, Safetensors

A multilingual BERT model developed by the Algomatic team. It supports mask-filling tasks with an 8192-token context length and a vocabulary of 151,680 tokens.
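Both models above target the fill-mask task: the model scores every vocabulary token at a `[MASK]` position, and the top-scoring candidates are returned. A minimal sketch of that final step, assuming hypothetical logits and a toy vocabulary (the function name `fill_mask_topk` and all values are illustrative, not part of either model's API):

```python
import math

def fill_mask_topk(logits, vocab, k=2):
    """Return the top-k (token, probability) pairs for a masked position.

    `logits` are the raw scores a model emits over the vocabulary at the
    [MASK] position; softmax turns them into probabilities.
    """
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]  # subtract max for stability
    total = sum(exps)
    probs = [e / total for e in exps]
    ranked = sorted(zip(vocab, probs), key=lambda pair: pair[1], reverse=True)
    return ranked[:k]

# Hypothetical logits for "The capital of France is [MASK]."
vocab = ["Paris", "London", "Tokyo", "banana"]
logits = [9.1, 5.3, 4.8, 0.2]
print(fill_mask_topk(logits, vocab))  # "Paris" ranks first
```

In practice the logits come from a forward pass of the model; the 151,680-entry vocabulary noted for the base model means this softmax runs over 151,680 scores per masked position.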